Markov chain
noun
Definition of MARKOV CHAIN
: a usually discrete stochastic process (as a random walk) in which the probabilities of occurrence of various future states depend only on the present state of the system or on the immediately preceding state and not on the path by which the present state was achieved —called also Markoff chain
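The defining ("Markov") property is that P(X_{n+1} = x | X_n, ..., X_0) = P(X_{n+1} = x | X_n): the future depends on the past only through the present state. A minimal Python sketch of this behavior, using invented weather states and transition probabilities purely for illustration:

    import random

    # Illustrative two-state Markov chain. The next state is sampled
    # using only the current state, never the earlier history
    # (the Markov property). States and probabilities are made up.
    TRANSITIONS = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    def next_state(current):
        """Sample the next state from the current state's row."""
        states = list(TRANSITIONS[current])
        weights = list(TRANSITIONS[current].values())
        return random.choices(states, weights=weights)[0]

    state = "sunny"
    walk = [state]
    for _ in range(10):
        state = next_state(state)
        walk.append(state)
    print(" -> ".join(walk))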
Origin of MARKOV CHAIN
A. A. Markov †1922, Russian mathematician

First Known Use: 1938